“What if your Fitbit knew exactly what to say on a particular day to motivate you to get off the couch and run a 5K?” - Persado, Schwab 2017

Introduction

New Tech and Mass Manipulation

Concerns about potential mass manipulation are growing as new technologies use our data to predict and monetize our behavior. Here are a few examples (links below).


“Uber redesigns app to predict where riders are headed and give them more to do in the car”

The Facebook Experiment

FitBit Data Concerns

Facebook and Cambridge Analytica (Trailer)

Key Concepts:

Hypernudging: “drawing on Big Data to nudge individuals with personalized feedback to change their behavior” (Lanzing 2019)

Self-tracking: “Fuelled by real-time data, algorithms create personalized online choice architectures that aim to nudge individual users to effectively change their behavior.” (Lanzing 2019)

Research Paper Outline

Main goal: An ethical critique of self-tracking technologies and their data-collection-based algorithmic hypernudging.

Theoretical Framework: decisional privacy and informational privacy as complementary dimensions

Claim: self-tracking technologies and their hypernudging threaten individuals’ autonomy because they violate both decisional and informational privacy.

Four Steps:

Nudging vs. Hypernudging (1.1 - 1.4)

Re-evaluation of decisional privacy (1.5 - 1.6)

Combination of informational privacy and decisional privacy (1.7 - 1.9)

Three potential objections (1.10)

Main Body

Self-Tracking

Also known as: “life-logging, quantified self, personal analytics, and personal informatics” (Lanzing 2019)

Definition: “the practice of quantifying behavior through extensive self-surveillance for the purpose of behavioral change” (Lanzing 2019)

How-to: wearable digital devices and/or smartphone apps + social media (external online platforms)

Examples:

Strava: self-tracking & social network for athletes

Other Fitness apps: FitBit, Runkeeper

Other self-tracking apps: DrinkLess, SleepCycle, Lose it, SexPositive, What to Expect, 23andMe, MySugr.

Main attraction: personalized feedback and thus self-improvement (and empowerment)

Main concern: Big Data-driven personalized choice architectures

“Big Data has enabled “personalized” choice architectures: choice architectures that are designed according to user data feedback. Personalized feedback in self-tracking is based on the analysis of large aggregates of (personal) information or “Big Data,” also referred to as personal analytics. The analysis aims to identify patterns and interesting correlations in the data. Based on the analysis, many devices, and apps make suggestions to their users about how they can change or improve their behavior, what choices to make.” (Lanzing 2019)
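To make the quoted mechanism concrete, here is a minimal sketch in Python of how a personalized choice architecture might work: the system compares a user’s tracked data against a correlation mined from aggregate data, then emits a tailored suggestion. The threshold, the “morning walkers” correlation, and the function name are all invented for illustration; this is not the logic of Fitbit, Strava, or any real app.

```python
from statistics import mean

# Hypothetical pattern "mined" from aggregate user data (invented for
# illustration): users who walk before 9 a.m. hit their goal more often.
MINED_PATTERN_TIP = "Users who walk before 9 a.m. reach their goal more often. Try a morning walk?"

def personalized_nudge(daily_steps, goal=8000, walks_in_morning=False):
    """Return feedback tailored to this user's tracked step counts."""
    avg = mean(daily_steps)
    if avg >= goal:
        return "Great streak! You're averaging above your goal."
    if not walks_in_morning:
        # Personalization: steer this user toward the mined pattern.
        return MINED_PATTERN_TIP
    return f"You're {goal - round(avg)} steps short on average. A short evening walk could close the gap."

print(personalized_nudge([5200, 6100, 4800], walks_in_morning=False))
```

The point of the sketch is that the suggestion is not neutral information: which message the user sees is chosen by the designer’s rules and the aggregate data, which is exactly the choice-architecture worry Lanzing raises.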

Here are some interesting articles I found on self-tracking; feel free to take a look around!

The Dark Side of Self-Tracking

‘The bot asked me four times a day how I was feeling’: is tracking everything actually good for us?

Everybody is a self-tracker

Nudging and Hypernudging

So, nudging is when a choice environment is designed to gently steer you toward certain options without actually taking any away. Hypernudging? That’s nudging on steroids: it uses Big Data and real-time feedback to personalize the nudges and update them constantly. We’ll talk about how both techniques affect your privacy.


Decisional and Informational Privacy

Decisional privacy is about your freedom to make decisions and act on them without interference. Informational privacy is about your control over who gets to know what about you. We’ll explore how the two differ and why the paper treats them as complementary.


Conclusion

In conclusion, this paper argues that hypernudging in self-tracking technologies threatens our autonomy by violating both decisional and informational privacy. So let’s all be a bit more mindful of the choices these apps are making for us. Thanks for checking out my website!
